On the Complexity and Performance of Parsing with Derivatives
Current algorithms for context-free parsing inflict a trade-off between ease
of understanding, ease of implementation, theoretical complexity, and practical
performance. No algorithm achieves all of these properties simultaneously.
Might et al. (2011) introduced parsing with derivatives, which handles
arbitrary context-free grammars while being both easy to understand and simple
to implement. Despite much initial enthusiasm and a multitude of independent
implementations, its worst-case complexity has never been proven to be better
than exponential. In fact, high-level arguments claiming it is fundamentally
exponential have been advanced and even accepted as part of the folklore.
Performance ended up being sluggish in practice, and this sluggishness was
taken as informal evidence of exponentiality.
In this paper, we reexamine the performance of parsing with derivatives. We
have discovered that it is not exponential but, in fact, cubic. Moreover,
simple (though perhaps not obvious) modifications to the implementation by
Might et al. (2011) lead to an implementation that is not only easy to
understand but also highly performant in practice.

Comment: 13 pages; 12 figures; implementation at
http://bitbucket.org/ucombinator/parsing-with-derivatives/ ; published in
PLDI '16, Proceedings of the 37th ACM SIGPLAN Conference on Programming
Language Design and Implementation, June 13-17, 2016, Santa Barbara, CA,
US
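The derivative-based approach is easiest to see in its regular-language core, which the paper extends to full context-free grammars via laziness, memoization, and fixed points. Below is a minimal sketch of a Brzozowski-derivative recognizer — illustrative code only, not the authors' implementation:

```python
# Minimal Brzozowski-derivative recognizer for regular expressions.
# This sketches only the regular-language core; parsing with derivatives
# extends the same idea to context-free grammars.
from dataclasses import dataclass

class Re: pass

@dataclass(frozen=True)
class Empty(Re): pass        # matches nothing

@dataclass(frozen=True)
class Eps(Re): pass          # matches only the empty string

@dataclass(frozen=True)
class Char(Re):
    c: str                   # matches a single character

@dataclass(frozen=True)
class Alt(Re):
    left: Re
    right: Re                # union of two languages

@dataclass(frozen=True)
class Seq(Re):
    first: Re
    second: Re               # concatenation

@dataclass(frozen=True)
class Star(Re):
    body: Re                 # Kleene star

def nullable(r: Re) -> bool:
    """Does r accept the empty string?"""
    if isinstance(r, (Empty, Char)): return False
    if isinstance(r, (Eps, Star)): return True
    if isinstance(r, Alt): return nullable(r.left) or nullable(r.right)
    if isinstance(r, Seq): return nullable(r.first) and nullable(r.second)

def deriv(r: Re, c: str) -> Re:
    """The language of words w such that c·w is in the language of r."""
    if isinstance(r, (Empty, Eps)): return Empty()
    if isinstance(r, Char): return Eps() if r.c == c else Empty()
    if isinstance(r, Alt): return Alt(deriv(r.left, c), deriv(r.right, c))
    if isinstance(r, Seq):
        d = Seq(deriv(r.first, c), r.second)
        # if the first part can be empty, c may also start the second part
        return Alt(d, deriv(r.second, c)) if nullable(r.first) else d
    if isinstance(r, Star): return Seq(deriv(r.body, c), r)

def matches(r: Re, s: str) -> bool:
    """Take the derivative once per character; accept if the result is nullable."""
    for c in s:
        r = deriv(r, c)
    return nullable(r)

# (ab)* recognizes "", "ab", "abab", ...
ab_star = Star(Seq(Char("a"), Char("b")))
print(matches(ab_star, "abab"))  # True
print(matches(ab_star, "aba"))   # False
```

Without simplification and memoization the derivative terms grow with each character; the paper's modifications (compaction, memoized fixed points) are what bring the worst case down to cubic.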
Comprehensive experimental analysis of nonlinear dynamics in an optically-injected semiconductor laser
We present the first comprehensive experimental study, to our knowledge, of the routes between nonlinear dynamics induced in a semiconductor laser under external optical injection, based on an analysis of time-averaged measurements of the optical and RF spectra and phasors of real-time series of the laser output. The different means of analysis are compared for several types of routes, and the benefits of each are discussed in terms of the identification and mapping of the nonlinear dynamics. Finally, the results are presented in a novel audio/video format that describes the evolution of the dynamics with the injection parameters. © 2011 Author(s)
An Improved Progressive Lossy-to-Lossless Coding Method for Arbitrarily-Sampled Image Data
A method for the progressive lossy-to-lossless coding of arbitrarily-sampled image data is proposed. Through experimental results, the proposed method is demonstrated to have a rate-distortion performance that is vastly superior to that of the state-of-the-art image-tree (IT) coding scheme. In particular, at intermediate rates (i.e., in progressive decoding scenarios), the proposed method yields image reconstructions with a peak-signal-to-noise ratio that is much higher (sometimes by several dB) than that of the IT scheme, while simultaneously achieving a slightly lower lossless rate.
Pushdown Control-Flow Analysis for Free
Traditional control-flow analysis (CFA) for higher-order languages, whether
implemented by constraint-solving or abstract interpretation, introduces
spurious connections between callers and callees. Two distinct invocations of a
function will necessarily pollute one another's return-flow. Recently, three
distinct approaches have been published which provide perfect call-stack
precision in a computable manner: CFA2, PDCFA, and AAC. Unfortunately, CFA2 and
PDCFA are difficult to implement and require significant engineering effort.
Furthermore, all three are computationally expensive; for a monovariant
analysis, CFA2 is in O(2^n), PDCFA is in O(n^6), and AAC is in O(n^8).
In this paper, we describe a new technique that builds on these but is both
straightforward to implement and computationally inexpensive. The crucial
insight is an unusual state-dependent allocation strategy for the addresses of
continuations. Our technique imposes only a constant-factor overhead on the
underlying analysis and, with monovariance, costs only O(n^3) in the worst case.
This paper presents the intuitions behind this development, a proof of the
precision of this analysis, and benchmarks demonstrating its efficacy.

Comment: in Proceedings of the 43rd Annual ACM SIGPLAN-SIGACT Symposium on
Principles of Programming Languages, 201
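The allocation insight can be caricatured in a few lines: if a continuation address is keyed on the call's *target* (here, hypothetically, the callee together with its abstract argument) rather than shared per callee, distinct invocations stop polluting one another's return-flow. The toy below models return-flow only and is not the paper's abstract machine; all names and the identity-function modeling are illustrative assumptions:

```python
# Toy illustration of state-dependent continuation allocation.
# Each callee is modeled as the identity function, so the exact result
# of every call is its own argument; spurious return-flow shows up as
# extra values in a caller's result set.

def analyze(calls, precise):
    """calls: list of (caller, callee, arg) triples.
    Returns {caller: set of abstract results flowing back to it}."""
    konts = {}  # continuation address -> set of abstract results
    for caller, callee, arg in calls:
        # Allocation strategy: precise keys the address on the target
        # state (callee + abstract argument); imprecise shares one
        # address per callee, merging all invocations.
        addr = (callee, arg) if precise else callee
        konts.setdefault(addr, set()).add(arg)  # identity: result == arg
    # Return flow: each caller observes everything stored at its address.
    results = {}
    for caller, callee, arg in calls:
        addr = (callee, arg) if precise else callee
        results.setdefault(caller, set()).update(konts[addr])
    return results

calls = [("main1", "id", 1), ("main2", "id", 2)]
print(analyze(calls, precise=False))  # {'main1': {1, 2}, 'main2': {1, 2}}
print(analyze(calls, precise=True))   # {'main1': {1}, 'main2': {2}}
```

With the shared address, both call sites of `id` see the merged set {1, 2} — the spurious caller/callee connection the abstract describes; keying the address on the target keeps the two invocations' return-flows separate at only a constant-factor cost in addresses.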
New Public Management And Service Science: Preparing Students For The 21st Century
This effort will seek to discover the foundations of public management and how it relates to information technology, specifically understanding E-government and other technologies in the public domain. Public Management is the development or application of methodical and systematic techniques, often employing comparison quantification, and measurement, that are designed to make the operations of public organizations more efficient, effective and increasingly responsive. This is a considerably crisper, concise, and narrow definition compared to other definitions of public administration, and its sharper focus is attributable to the larger fields encompassing of values in addition to those of efficiency, effectiveness and responsiveness
A Simulation Study Of The Impact Of Forecast Recovery For Control Charts Applied To ARMA Processes
Forecast-based schemes are often used to monitor autocorrelated processes, but the resulting forecast recovery has a significant effect on the performance of control charts. This article describes forecast recovery for autocorrelated processes, and a simulation study is used to explain the performance of control charts applied to forecast errors.
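The forecast-recovery effect is easy to reproduce: for an AR(1) process monitored via one-step-ahead forecast errors, a sustained mean shift appears in full only in the first post-shift error, and thereafter as just the fraction (1 − φ) of the shift, because the forecast "chases" the shifted process. A minimal sketch with illustrative parameters (not taken from the article):

```python
# Forecast recovery for an AR(1) process monitored via one-step-ahead
# forecast errors. Parameters (phi, shift size) are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
phi, n, shift_at, shift = 0.8, 200, 100, 4.0

# In-control AR(1): x_t = phi * x_{t-1} + eps_t, eps_t ~ N(0, 1)
base = np.zeros(n)
for t in range(1, n):
    base[t] = phi * base[t - 1] + rng.normal()

# Inject a sustained mean shift at t = shift_at
x = base.copy()
x[shift_at:] += shift

# One-step-ahead forecast errors (residuals): e_t = x_t - phi * x_{t-1},
# the quantity a forecast-based control chart actually plots
e = x[1:] - phi * x[:-1]
e_base = base[1:] - phi * base[:-1]

# Forecast recovery: the first post-shift error carries the full shift;
# every later error carries only (1 - phi) of it.
print(e[shift_at - 1] - e_base[shift_at - 1])     # ≈ 4.0 (full shift)
print((e[shift_at:] - e_base[shift_at:]).mean())  # ≈ 0.8 = (1 - phi) * shift
```

With φ = 0.8, later errors retain only 20% of the shift, so a Shewhart chart on forecast errors may signal once and then fall silent even though the process mean remains shifted — the performance effect the article studies.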